Search for: All records

Creators/Authors contains: "Zhang, Yunxiang"


  1. Free, publicly-accessible full text available August 9, 2026
  2. Computer-generated holography (CGH) simulates the propagation and interference of complex light waves, enabling it to reconstruct realistic images as seen from a specific viewpoint by solving the corresponding Maxwell equations. However, in applications such as virtual and augmented reality, viewers should be able to observe holograms freely from arbitrary viewpoints, much as we naturally see the physical world. In this work, we train a neural network to generate holograms for any view in a scene. Our result is the Neural Holographic Field: the first artificial-neural-network-based representation of light-wave propagation in free space, which transforms sparse 2D photos into holograms that are not only 3D but also freely viewable from any perspective. We demonstrate this by visualizing various smartphone-captured scenes from arbitrary six-degree-of-freedom viewpoints on a prototype holographic display. To this end, we encode the measured light intensity from photos into a neural-network representation of the underlying wavefields. Our method implicitly learns amplitude and phase surrogates of the underlying incoherent light waves under coherent-light display conditions. During playback, the learned model predicts the underlying continuous complex wavefront propagating to arbitrary views to generate holograms. (A minimal sketch of the free-space propagation operator used in CGH appears after this result list.)
  3. Free, publicly-accessible full text available July 26, 2026
  4. Free, publicly-accessible full text available November 1, 2025
  5. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either ignore force inputs from users entirely or rely on obtrusive sensing devices that compromise the user experience. By identifying users' muscle activation patterns while they engage in VR/AR, we design a learning-based neural interface for natural and intuitive force input. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface decodes finger-wise forces in real time with 3.3% mean error and generalizes to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings pushing research towards more realistic physicality in future VR/AR. (A generic sketch of such an EMG-to-force decoder appears after this result list.)
  6. Bacteria-powered biobatteries using multiple microbial species under well-mixed conditions show a temporary performance enhancement through cooperative interaction, where one species produces a resource that another species needs but cannot synthesize. Despite excitement about such artificial microbial consortia, these mixed populations are not robust to environmental changes and have difficulty generating long-lasting power because individual species compete with their neighbors for space and resources. In nature, microbial communities are organized spatially: multiple species are separated by a few hundred micrometers to balance their interaction and competition. However, defining a microscale spatial microbial structure in miniature biobatteries has been challenging. Here, a technique for designing microscale spatial structures of multiple microbial species that significantly improves biobattery performance is demonstrated. A solid-state, layer-by-layer, agar-based culture platform is proposed, in which individual microcolonies separately confined in microscale agar layers form a 3-D spatial structure that allows the exchange of metabolites without physical contact between the individual species. The optimized microbial co-cultures are determined from hypothesis-driven selections of naturally occurring bacteria. Vertically and horizontally structured 3-D microbial communities in solid-state agar-based microcompartments demonstrate the practicability of the biobattery, generating greater power for longer in a more self-sustaining manner than monocultures and other mixed populations.
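The Neural Holographic Field result (item 2 above) predicts a continuous complex wavefront and numerically propagates it to arbitrary views. As a point of reference only, the sketch below shows the standard free-space propagation operator used in CGH, the angular spectrum method, in plain NumPy. It does not reproduce the paper's neural representation or training procedure; the function name `propagate_asm` and the display parameters (532 nm wavelength, 8 µm pixel pitch, 10 cm distance) are illustrative assumptions.

```python
import numpy as np

def propagate_asm(field, wavelength, pixel_pitch, distance):
    """Propagate a complex wavefield by `distance` meters using the
    angular spectrum method (illustrative sketch, not the paper's code)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)   # spatial frequencies along x (1/m)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)   # spatial frequencies along y (1/m)
    FX, FY = np.meshgrid(fx, fy)

    # Transfer function H = exp(i * 2*pi * d * sqrt(1/lambda^2 - fx^2 - fy^2)),
    # with evanescent components (negative argument) suppressed.
    arg = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * distance * kz) * (arg > 0)

    spectrum = np.fft.fft2(field)
    return np.fft.ifft2(spectrum * H)

# Usage: propagate a phase-only hologram and look at the resulting intensity.
phase = np.random.uniform(0.0, 2.0 * np.pi, (1024, 1024))
hologram = np.exp(1j * phase)
image_plane = propagate_asm(hologram, 532e-9, 8e-6, 0.10)
intensity = np.abs(image_plane) ** 2   # what a camera at that plane would measure
```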
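Item 5 above describes decoding finger-wise forces from forearm surface EMG with a neural network. The PyTorch sketch below is a generic stand-in for such a decoder: the channel count, window length, layer sizes, and class name `EMGForceDecoder` are assumptions for illustration, not the authors' published configuration or calibration scheme.

```python
import torch
import torch.nn as nn

class EMGForceDecoder(nn.Module):
    """Generic sketch: map a short window of multi-channel forearm EMG
    to per-finger force estimates (all hyperparameters are assumed)."""
    def __init__(self, n_channels=8, n_fingers=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, n_fingers),
        )

    def forward(self, emg):      # emg: (batch, channels, samples)
        return self.net(emg)     # (batch, n_fingers) force estimates

# Usage: one 256-sample window of 8-channel EMG; real-time inference
# would stream such windows through the trained model.
model = EMGForceDecoder()
window = torch.randn(1, 8, 256)
forces = model(window)
```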